Lost in translation: data integration tools meet the Semantic Web (experiences from the Ondex project)
More information is now being published in machine-processable form on the
web and, as de facto distributed knowledge bases materialize, partly
encouraged by the vision of the Semantic Web, the focus is shifting from the
publication of this information to its consumption. Platforms for data
integration, visualization and analysis that are based on a graph
representation of information appear to be prime candidates for consuming
web-based information that is readily expressible as graphs. The question is
whether applying these platforms to information available on the Semantic Web
requires some adaptation of their data structures and semantics.
Ondex is a network-based data integration, analysis and visualization platform
which has been developed in a Life Sciences context. A number of features,
including semantic annotation via ontologies and attention to provenance and
evidence, make it an ideal candidate to consume Semantic Web information, as
well as a prototype for the application of network analysis tools in this
context. By analyzing the Ondex data structure and its usage, we have found a
set of discrepancies and errors arising from the semantic mismatch between a
procedural approach to network analysis and the implications of a web-based
representation of information. In this paper we report on the simple
methodology that we adopted to conduct this analysis, and on the issues we
found, which may be relevant to a range of similar platforms.
Comment: Presented at DEIT, Data Engineering and Internet Technology, 2011,
IEEE: CFP1113L-CD
On Factor Universality in Symbolic Spaces
The study of factoring relations between subshifts or cellular automata is
central in symbolic dynamics. Besides, a notion of intrinsic universality for
cellular automata based on an operation of rescaling is receiving more and more
attention in the literature. In this paper, we propose to study the factoring
relation up to rescalings, and ask for the existence of universal objects for
that simulation relation. In classical simulations of a system S by a system T,
the simulation takes place on a specific subset of configurations of T
depending on S (this is the case for intrinsic universality). Our setting,
however, requires every configuration of T to have a meaningful interpretation
in S. Despite this strong requirement, we show that there exists a cellular
automaton able to simulate any other in a large class containing arbitrarily
complex ones. We also consider the case of subshifts and, using arguments from
recursion theory, we give negative results about the existence of universal
objects in some classes.
Processing second-order stochastic dominance models using cutting-plane representations
This is the post-print version of the article. The official published version can be accessed from the links below. Copyright © 2011 Springer-Verlag.
Second-order stochastic dominance (SSD) is widely recognised as an important decision criterion in portfolio selection. Unfortunately, stochastic dominance models are known to be very demanding from a computational point of view. In this paper we consider two classes of models which use SSD as a choice criterion. The first, proposed by Dentcheva and Ruszczyński (J Bank Finance 30:433–451, 2006), uses an SSD constraint, which can be expressed as integrated chance constraints (ICCs). The second, proposed by Roman et al. (Math Program, Ser B 108:541–569, 2006), uses SSD through a multi-objective formulation with CVaR objectives. Cutting-plane representations and algorithms were proposed by Klein Haneveld and Van der Vlerk (Comput Manage Sci 3:245–269, 2006) for ICCs, and by Künzi-Bay and Mayer (Comput Manage Sci 3:3–27, 2006) for CVaR minimization. We build on these concepts to propose representations and solution methods for the above class of SSD-based models. We describe a cutting-plane-based solution algorithm and outline implementation details. A computational study is presented, which demonstrates the effectiveness and the scale-up properties of the solution algorithm, as applied to the SSD model of Roman et al. (Math Program, Ser B 108:541–569, 2006).
This study was funded by OTKA, Hungarian National Fund for Scientific Research, project 47340; by the Mobile Innovation Centre, Budapest University of Technology, project 2.2; by Optirisk Systems, Uxbridge, UK; and by BRIEF (Brunel University Research Innovation and Enterprise Fund).
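For background, the cutting planes of Künzi-Bay and Mayer build on the Rockafellar–Uryasev minimisation representation of CVaR; the following standard identity (not quoted in the abstract; X denotes a loss variable and α ∈ (0,1) a confidence level) is the starting point:

```latex
\mathrm{CVaR}_{\alpha}(X)
\;=\;
\min_{z \in \mathbb{R}}
\left\{ \, z + \frac{1}{1-\alpha}\,\mathbb{E}\!\left[(X - z)^{+}\right] \right\}
```

For a finite scenario set the expectation is a sum, the minimisation becomes a linear program, and linearisations of the positive-part term supply the cuts exploited by the algorithm.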
Local Hidden Variables Underpinning of Entanglement and Teleportation
Entangled states whose Wigner functions are non-negative may be viewed as
being accounted for by local hidden variables (LHV). Recently, there have been
studies of Bell's inequality violation (BIQV) for such states, in conjunction
with the well known theorem of Bell that precludes BIQV for theories that have
LHV underpinning. We extend these studies to teleportation which is also based
on entanglement. We investigate whether, to what extent, and under what
conditions teleportation may be accounted for via an LHV theory. Our study
allows us to expose the role of various quantum requirements, e.g. the
uncertainty relation among non-commuting operators and the no-cloning theorem,
which forces the complete elimination of the teleported state at its initial
port.
Comment: 24 pages, 1 figure, accepted to Found. Phys.
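As a reminder of the framework the abstract refers to (standard notation, not taken from the paper), a theory admits an LHV account when joint outcome probabilities factorise over a hidden variable λ with distribution ρ(λ):

```latex
P(a, b \mid \mathbf{A}, \mathbf{B})
\;=\;
\int \rho(\lambda)\,
P_{A}(a \mid \mathbf{A}, \lambda)\,
P_{B}(b \mid \mathbf{B}, \lambda)\,
\mathrm{d}\lambda
```

Bell's theorem shows that any theory of this form satisfies the Bell inequalities; for states with non-negative Wigner functions, the Wigner function itself can play the role of ρ(λ).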
Warm stellar matter with deconfinement: application to compact stars
We investigate the properties of mixed stars formed by hadronic and quark
matter in β-equilibrium described by appropriate equations of state (EOS)
in the framework of relativistic mean-field theory. We use the non-linear
Walecka model for the hadron matter and the MIT Bag and the Nambu-Jona-Lasinio
models for the quark matter. The phase transition to a deconfined quark phase
is investigated. In particular, we study the dependence of the onset of a mixed
phase and a pure quark phase on the hyperon couplings, quark model and
properties of the hadronic model. We calculate the strangeness fraction as a
function of baryonic density for the different EOS. With the NJL model the
content in the mixed phase decreases. The calculations were performed for T=0
and for finite temperatures in order to describe neutron and proto-neutron
stars. The star properties are discussed. Both the Bag model and the NJL model
predict a mixed phase in the interior of the star. Maximum allowed masses for
proto-neutron stars are larger for the NJL model than for the Bag model.
Comment: RevTeX, 14 figures, accepted for publication in Physical Review
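For orientation, a standard result not stated in the abstract: in the MIT Bag model with massless quarks, the quark-phase EOS reduces to a simple linear relation between pressure P and energy density ε,

```latex
P \;=\; \tfrac{1}{3}\left(\varepsilon - 4B\right),
```

where B is the bag constant. In the Nambu–Jona-Lasinio model, by contrast, quark masses are generated dynamically from the interaction, which is what drives the different strangeness content of the mixed phase reported above.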
Towards Machine Wald
The past century has seen a steady increase in the need of estimating and
predicting complex systems and making (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do
when faced with uncertainty is challenging in several major ways:
(1) Finding optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To address these challenges, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information-Based Complexity.
Comment: 37 pages
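The title alludes to Wald's statistical decision theory; in that framing (standard notation, not drawn from the abstract), an optimal statistical estimator is a minimax decision rule

```latex
d^{\ast} \;\in\; \arg\min_{d \in \mathcal{D}} \;
\sup_{\theta \in \Theta} \;
\mathbb{E}_{X \sim \mathbb{P}_{\theta}}
\big[ \mathcal{L}\big(\theta, d(X)\big) \big],
```

where 𝒟 is a class of admissible decision rules (estimators), Θ the set of admissible scenarios, and ℒ a loss function. Challenge (1) above concerns making this problem well posed when Θ is only partially known, and challenge (2) the fact that 𝒟 and Θ are typically infinite-dimensional.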
Global stability for a class of virus models with CTL immune response and antigenic variation
We study the global stability of a class of models for in-vivo virus
dynamics, that take into account the CTL immune response and display antigenic
variation. This class includes a number of models that have been extensively
used to model HIV dynamics. We show that models in this class are globally
asymptotically stable, under mild hypotheses, by using appropriate Lyapunov
functions. We also characterise the stable equilibrium points for the entire
biologically relevant parameter range. As a byproduct, we are able to determine
the diversity of the persistent strains.
Comment: 15 pages
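For context, global-stability arguments for this family of models typically use Volterra-type Lyapunov functions; a common template (a standard technique, not quoted from the abstract) for state variables x_i with equilibrium values x_i* is

```latex
V(x_1,\dots,x_n)
\;=\;
\sum_{i=1}^{n} c_i
\left( x_i - x_i^{\ast} - x_i^{\ast} \ln \frac{x_i}{x_i^{\ast}} \right),
\qquad c_i > 0,
```

which is non-negative, vanishes only at the equilibrium, and whose derivative along trajectories of the model is then shown to be non-positive.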
Observation of hard scattering in photoproduction events with a large rapidity gap at HERA
Events with a large rapidity gap and total transverse energy greater than 5
GeV have been observed in quasi-real photoproduction at HERA with the ZEUS
detector. The distribution of these events as a function of the γp
centre-of-mass energy is consistent with diffractive scattering. For total
transverse energies above 12 GeV, the hadronic final states show predominantly
a two-jet structure with each jet having a transverse energy greater than 4
GeV. For the two-jet events, little energy flow is found outside the jets. This
observation is consistent with the hard scattering of a quasi-real photon with
a colourless object in the proton.
Comment: 19 pages, LaTeX, 4 figures appended as uuencoded files
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb⁻¹ of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity-conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
Ten millennia of hepatitis B virus evolution
Hepatitis B virus (HBV) has been infecting humans for millennia and remains a global health problem, but its past diversity and dispersal routes are largely unknown. We generated HBV genomic data from 137 Eurasians and Native Americans dated between ~10,500 and ~400 years ago. We date the most recent common ancestor of all HBV lineages to between ~20,000 and 12,000 years ago, with the virus present in European and South American hunter-gatherers during the early Holocene. After the European Neolithic transition, Mesolithic HBV strains were replaced by a lineage likely disseminated by early farmers that prevailed throughout western Eurasia for ~4000 years, declining around the end of the 2nd millennium BCE. The only remnant of this prehistoric HBV diversity is the rare genotype G, which appears to have reemerged during the HIV pandemic.